20 research outputs found

    A semantic approach to enable data integration for the domain of flood risk management

    With so many things around us continuously producing and processing data, be it mobile phones, sensors attached to devices, or satellites sitting thousands of kilometres above our heads, data is becoming increasingly heterogeneous. Scientists are inevitably faced with data challenges, commonly characterised as the four V’s of data: volume, variety, velocity and veracity. In this paper, we address the issue of data variety. The task of integrating and querying such heterogeneous data is further compounded if the data is in unstructured form. We therefore propose an approach using Semantic Web and Natural Language Processing techniques to resolve the heterogeneity arising in data formats, bring together structured and unstructured data, and provide a unified data model for querying across disparate data sets.
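    As a concrete illustration of the kind of integration this abstract describes, the sketch below (a minimal example, not the authors' implementation; the flood vocabulary and all identifiers are invented) lifts one structured gauge reading and one fact extracted from a free-text report into a single RDF graph with rdflib, and then answers one SPARQL query that spans both sources.

```python
# Illustrative sketch only: unify a structured sensor record and a fact
# extracted from free text into one RDF graph, then query it with SPARQL.
# The vocabulary (http://example.org/flood#) is invented for this example.
from rdflib import Graph, Literal, Namespace, RDF

FLOOD = Namespace("http://example.org/flood#")

g = Graph()
g.bind("flood", FLOOD)

# Structured source: a river-gauge reading (e.g. a row from a CSV export).
gauge = FLOOD["gauge/eden-01"]
g.add((gauge, RDF.type, FLOOD.GaugeReading))
g.add((gauge, FLOOD.riverName, Literal("Eden")))
g.add((gauge, FLOOD.waterLevelMetres, Literal(2.4)))

# Unstructured source: a fact pulled out of a text report by an NLP
# pipeline (the entity and value extraction itself is not shown here).
report = FLOOD["report/2015-12-05"]
g.add((report, RDF.type, FLOOD.FloodReport))
g.add((report, FLOOD.riverName, Literal("Eden")))
g.add((report, FLOOD.observedImpact, Literal("road closure near the river")))

# A single SPARQL query now spans both sources via the shared vocabulary.
query = """
PREFIX flood: <http://example.org/flood#>
SELECT ?subject ?prop ?value
WHERE {
    ?subject flood:riverName "Eden" ;
             ?prop ?value .
}
"""
for row in g.query(query):
    print(row.subject, row.prop, row.value)
```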

    An Ontological Architecture for Principled and Automated System of Systems Composition

    A distributed system's functionality must continuously evolve, especially when the environmental context changes. Such required evolution imposes unbearable complexity on system development. An alternative is to make systems able to self-adapt by opportunistically composing at runtime to generate systems of systems (SoSs) that offer value-added functionality. The success of such an approach calls for abstracting the heterogeneity of systems and enabling the programmatic construction of SoSs with minimal developer intervention. We propose a general ontology-based approach to describing distributed systems, seeking to achieve abstraction and enable runtime reasoning between systems. We also propose an architecture that utilizes such ontologies to enable systems to discover and 'understand' each other, and potentially compose, all at runtime. We detail features of the ontology and the architecture through two contrasting case studies. We also quantitatively evaluate the scalability and validity of our approach through experiments and simulations. Our approach enables system developers to focus on high-level SoS composition without being tied down by deployment-specific implementation details.
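    A minimal sketch of the composition idea follows, assuming an invented capability hierarchy rather than the paper's ontology or architecture: composition is considered viable when every capability one system requires is subsumed by a capability another system provides.

```python
# Minimal sketch (not the paper's ontology): decide whether two systems can
# compose by checking that each required capability is satisfied by a
# provided capability that is equal to, or a specialisation of, it.
# The tiny concept hierarchy below is invented for illustration.
CONCEPT_PARENT = {
    "TemperatureSensing": "EnvironmentalSensing",
    "RainfallSensing": "EnvironmentalSensing",
    "EnvironmentalSensing": "Sensing",
    "DataLogging": "DataManagement",
}

def is_a(concept, ancestor):
    """True if `concept` equals `ancestor` or sits below it in the hierarchy."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = CONCEPT_PARENT.get(concept)
    return False

def can_compose(provided, required):
    """Every required capability must be met by some provided capability."""
    return all(any(is_a(p, r) for p in provided) for r in required)

weather_station = {"provides": {"TemperatureSensing", "RainfallSensing"}}
flood_model = {"requires": {"EnvironmentalSensing"}}

print(can_compose(weather_station["provides"], flood_model["requires"]))  # True
```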

    Intermediate CONNECT Architecture

    Interoperability remains a fundamental challenge when connecting heterogeneous systems which encounter and spontaneously communicate with one another in pervasive computing environments. This challenge is exacerbated by the highly heterogeneous technologies employed by each of the interacting parties, i.e., in terms of hardware, operating system, middleware protocols, and application protocols. The key aim of the CONNECT project is to drop this heterogeneity barrier and achieve universal interoperability. Here we report on the activities of WP1 in developing the CONNECT architecture that will underpin this solution. In this respect, we present the following key contributions from the second year. Firstly, the intermediate CONNECT architecture, which presents a more concrete view of the technologies and principles employed to enable interoperability between heterogeneous networked systems. Secondly, the design and implementation of the discovery enabler, with emphasis on the approaches taken to match compatible networked systems. Thirdly, the realisation of CONNECTors that can be deployed in the environment; we provide domain-specific language solutions to generate and translate between middleware protocols. Fourthly, we highlight the role of ontologies within CONNECT and demonstrate how they crosscut all functionality within the CONNECT architecture.
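    The sketch below illustrates, in deliberately simplified form, the job a generated CONNECTor performs: translating between two middleware message styles through a shared, protocol-neutral message. The formats and field names are invented and are not the project's domain-specific languages.

```python
# Illustrative sketch only, not CONNECT code: bridge two invented middleware
# message formats through a shared abstract message, which is the general
# idea behind CONNECTors that translate between protocols.
def legacy_rpc_to_abstract(msg):
    """Lift an invented RPC-style request into a protocol-neutral form."""
    return {"operation": msg["method"], "arguments": dict(msg["params"])}

def abstract_to_rest(msg):
    """Lower the protocol-neutral form into an invented REST-style request."""
    return {
        "verb": "POST",
        "path": "/" + msg["operation"],
        "body": msg["arguments"],
    }

rpc_request = {"method": "getWaterLevel", "params": {"station": "eden-01"}}
rest_request = abstract_to_rest(legacy_rpc_to_abstract(rpc_request))
print(rest_request)
# {'verb': 'POST', 'path': '/getWaterLevel', 'body': {'station': 'eden-01'}}
```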

    Models of everywhere revisited: a technological perspective

    The concept ‘models of everywhere’ was first introduced in the mid-2000s as a means of reasoning about the environmental science of a place, changing the nature of the underlying modelling process from one in which general model structures are used to one in which modelling becomes a learning process about specific places, in particular capturing the idiosyncrasies of that place. At one level, this is a straightforward concept, but at another it is a rich, multi-dimensional conceptual framework involving the following key dimensions: models of everywhere, models of everything and models at all times, being constantly re-evaluated against the most current evidence. This is a compelling approach with the potential to deal with epistemic uncertainties and nonlinearities. However, the approach has, as yet, not been fully utilised or explored. This paper examines the concept of models of everywhere in the light of recent advances in technology. The paper argues that, when first proposed, technology was a limiting factor, but now, with advances in areas such as the Internet of Things, cloud computing and data analytics, many of the barriers have been alleviated. Consequently, it is timely to look again at the concept of models of everywhere in practical conditions as part of a trans-disciplinary effort to tackle the remaining research questions. The paper concludes by identifying the key elements of a research agenda that should underpin such experimentation and deployment.

    The design and deployment of an end-to-end IoT infrastructure for the natural environment

    Internet of Things (IoT) systems have seen recent growth in popularity for city and home environments. We report on the design, deployment, and use of an IoT infrastructure for environmental monitoring and management. Working closely with hydrologists, soil scientists, and animal behaviour scientists, we successfully deployed and utilised a system to deliver integrated information across these fields in the first such example of real-time multidimensional environmental science. We describe the design of this system; its requirements and operational effectiveness for hydrological, soil, and ethological scientists; and our experiences from building, maintaining, and using the deployment at a remote site in difficult conditions. Based on this experience, we discuss key future work for the IoT community when working in these kinds of environmental deployments.
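    As a rough illustration of one building block of such a deployment (not the deployed system's code; every field name is invented), the sketch below timestamps a sensor reading and appends it to a local buffer file, the kind of store-and-forward step a gateway at a poorly connected remote site might take before uplinking data.

```python
# Hypothetical store-and-forward step for a field gateway: package a sensor
# reading as JSON and append it to a local buffer for later transmission.
import json
import time
from pathlib import Path

BUFFER_FILE = Path("readings_buffer.jsonl")

def record_reading(sensor_id, quantity, value, unit):
    """Build a timestamped reading and append it to the local buffer file."""
    reading = {
        "sensor_id": sensor_id,
        "quantity": quantity,
        "value": value,
        "unit": unit,
        "timestamp": time.time(),  # seconds since the Unix epoch
    }
    with BUFFER_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(reading) + "\n")
    return reading

print(record_reading("soil-moisture-07", "volumetric_water_content", 0.31, "m3/m3"))
```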

    Revised CONNECT Architecture

    Interoperability remains a fundamental challenge when connecting heterogeneous systems which encounter and spontaneously communicate with one another in pervasive computing environments. This challenge is exacerbated by the highly heterogeneous technologies employed by each of the interacting parties, i.e., in terms of hardware, operating system, middleware protocols, and application protocols. The key aim of the CONNECT project is to drop this heterogeneity barrier and achieve universal interoperability. Here we report on the revised CONNECT architecture, highlighting the work carried out to integrate the CONNECT enablers developed by the different partners; in particular, we present the progress of this work towards a finalised concrete architecture. In the third year this architecture has been enhanced to: i) produce concrete CONNECTors, ii) match networked systems based upon their goals and intent, and iii) use learning technologies to find the affordance of a system. We also report on the application of the CONNECT approach to streaming-based systems, further considering exploitation of CONNECT in the mobile environment.
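    The sketch below is a loose, invented stand-in for matching a networked system's advertised affordance against another system's goal: candidate systems are scored against a stated intent, here with a plain bag-of-words cosine similarity rather than the learning technologies the deliverable describes.

```python
# Rough sketch (not a CONNECT enabler): score how well each system's
# advertised affordance matches a stated goal using bag-of-words cosine
# similarity; the descriptions and goal text are invented.
from collections import Counter
from math import sqrt

def bag_of_words(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

goal = "publish river level readings as a stream"
affordances = {
    "gauge-service": "streams river level readings from gauges",
    "print-service": "submits documents to a networked printer",
}

for name, description in affordances.items():
    score = cosine(bag_of_words(goal), bag_of_words(description))
    print(name, round(score, 2))  # the gauge service scores highest
```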

    Final CONNECT Architecture

    Interoperability remains a fundamental challenge when connecting heterogeneous systems which encounter and spontaneously communicate with one another in pervasive computing environments. This challenge is exacerbated by the highly heterogeneous technologies employed by each of the interacting parties, i.e., in terms of hardware, operating system, middleware protocols, and application protocols. The key aim of the CONNECT project is to drop this heterogeneity barrier and achieve universal interoperability. Here we report on the revised CONNECT architecture, highlighting the work carried out to integrate the CONNECT enablers developed by the different partners; in particular, we present the progress of this work towards a finalised concrete architecture. In the third year this architecture has been enhanced to: i) produce concrete CONNECTors, ii) match networked systems based upon their goals and intent, and iii) use learning technologies to find the affordance of a system. We also report on the application of the CONNECT approach to streaming-based systems, further considering exploitation of CONNECT in the mobile environment.

    Rethinking data‐driven decision support in flood risk management for a big data age

    Decision-making in flood risk management is increasingly dependent on access to data, with the availability of data increasing dramatically in recent years. We are therefore moving towards an era of big data, with the added challenges that, in this area, data sources are highly heterogeneous, at a variety of scales, and include a mix of structured and unstructured data. The key requirement is therefore one of integration and subsequent analyses of this complex web of data. This paper examines the potential of a data-driven approach to support decision-making in flood risk management, with the goal of investigating a suitable software architecture and associated set of techniques to support a more data-centric approach. The key contribution of the paper is a cloud-based data hypercube that achieves the desired level of integration of highly complex data. This hypercube builds on innovations in cloud services for data storage, semantic enrichment and querying, and also features the use of notebook technologies to support open and collaborative scenario analyses in support of decision making. The paper also highlights the success of our agile methodology in weaving together cross-disciplinary perspectives and in engaging a wide range of stakeholders in exploring possible technological futures for flood risk management.
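    The toy example below conveys the hypercube idea in miniature, using fabricated records and pandas in place of the paper's cloud services: observations are held along several dimensions (catchment, month, source) and any slice or aggregation across those dimensions can be requested on demand.

```python
# Toy sketch of the data-hypercube idea; all records are fabricated.
import pandas as pd

records = pd.DataFrame([
    {"catchment": "Eden", "month": "2015-11", "source": "gauge", "rainfall_mm": 180},
    {"catchment": "Eden", "month": "2015-12", "source": "gauge", "rainfall_mm": 320},
    {"catchment": "Lune", "month": "2015-11", "source": "radar", "rainfall_mm": 150},
    {"catchment": "Lune", "month": "2015-12", "source": "radar", "rainfall_mm": 290},
])

# One "slice" of the cube: total rainfall by catchment and month, collapsing
# the source dimension.
cube_slice = records.pivot_table(
    index="catchment", columns="month", values="rainfall_mm", aggfunc="sum"
)
print(cube_slice)
```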

    The Role of Digital Technologies in Responding to the Grand Challenges of the Natural Environment: The Windermere Accord

    Digital technology is having a major impact on many areas of society, and there is equal opportunity for impact on science. This is particularly true in the environmental sciences as we seek to understand the complexities of the natural environment under climate change. This perspective presents the outcomes of a summit in this area, a unique cross-disciplinary gathering bringing together environmental scientists, data scientists, computer scientists, social scientists, and representatives of the creative arts. The key output of this workshop is an agreed vision in the form of a framework and associated roadmap, captured in the Windermere Accord. This accord envisions a new kind of environmental science underpinned by unprecedented amounts of data, with technological advances leading to breakthroughs in taming uncertainty and complexity, and also supporting openness, transparency, and reproducibility in science. The perspective also includes a call to build an international community working in this important area.

    A dynamic interoperability model for an emergent middleware framework

    The rapidly changing world of computing has sparked a major increase in the complexity, heterogeneity and dynamicity of distributed systems. Consequently, standard middleware platforms are unable to cope with the extreme heterogeneity and dynamicity of this new generation of distributed systems. Furthermore, given new trends in mobile and pervasive applications, distributed systems are required to connect to one another at runtime, implying that heterogeneities arising between systems need to be resolved on the fly. This ability of a system to interact with a different system is known as interoperability. Existing middleware interoperability solutions cannot deal with dynamic interoperability because of their static and hand-crafted nature. Hence, more advanced solutions that go beyond the state of the art in middleware are required to handle interoperability on the fly. This thesis investigates the challenges of dynamic interoperability and how to devise an emergent middleware to enable such dynamic interoperation. To overcome the heterogeneities arising at runtime, the thesis also investigates the approach of the Semantic Web community of employing semantic reasoning over concepts at the application level. The thesis maintains that one notable contribution of this community, the use of ontologies, plays a significant role in the construction of such an emergent middleware framework. As a result, the thesis proposes a framework with three distinct design principles (matching, classifying and mapping) to tackle dynamic interoperability at the message level of systems, and highlights the cross-cutting role played by ontologies in the framework. The experimental evaluation shows that the framework is able to tackle the heterogeneity arising in messages at runtime, and highlights the significance of linguistic techniques in assisting ontologies at the matching stage. Finally, the performance evaluation shows how the framework behaves at runtime and confirms that it performs its intended purpose.
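    A simplified sketch of the matching and mapping stages described above follows, using plain string similarity as a stand-in for the thesis's combination of ontologies and linguistic techniques; the message fields and the 0.5 threshold are invented for illustration.

```python
# Simplified sketch (not the thesis framework): align message fields from two
# heterogeneous systems with a basic string similarity, then use the inferred
# mapping to translate a message from one format to the other.
from difflib import SequenceMatcher

def best_match(field, candidates):
    """Return the candidate field most similar to `field`, with its score."""
    scored = [(c, SequenceMatcher(None, field.lower(), c.lower()).ratio()) for c in candidates]
    return max(scored, key=lambda pair: pair[1])

source_message = {"riverLevel": 2.4, "stationId": "eden-01"}
target_fields = ["water_level", "station_identifier", "timestamp"]

mapping = {}
for field in source_message:
    target, score = best_match(field, target_fields)
    if score > 0.5:  # invented threshold; richer matching would use ontologies
        mapping[field] = target

translated = {mapping[k]: v for k, v in source_message.items() if k in mapping}
print(mapping)     # inferred field-to-field mapping
print(translated)  # the source message expressed in the target fields
```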